Max-Min Posterior Pseudo-Probabilities Estimation of Posterior Class Probabilities to Maximize Class Separability
Authors
Abstract
Estimating posterior class probabilities is desirable for optimal decision-making, decision confidence measurement, and accurate performance evaluation of a classifier. In this paper, we address the problem of estimating posterior class probabilities by learning from samples to produce optimal classifiers. We introduce a posterior pseudo-probability function based on Bayes' formula that transforms class-conditional probability values into posterior class probability values. The parameters of this function are trained from data using the proposed max-min posterior pseudo-probability method, in which the optimization objective is to maximize the posterior pseudo-probabilities for positive samples while simultaneously minimizing those for negative samples. We demonstrate the effectiveness of our estimator for learning pattern classification through experimental results in three applications: content-based image retrieval, text extraction, and handwritten digit recognition.
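The max-min training idea described above can be sketched in code. The concrete transform below is an illustrative assumption, not the paper's exact formulation: a sigmoid pseudo-probability p(s) = σ(a·s + b) of a scalar class-conditional score s, with the parameters (a, b) fit by gradient ascent on the objective mean(p over positives) − mean(p over negatives).

```python
import numpy as np

def pseudo_prob(s, a, b):
    """Map a class-conditional score s to a posterior pseudo-probability
    in (0, 1) via a sigmoid of an affine transform (assumed form)."""
    return 1.0 / (1.0 + np.exp(-(a * s + b)))

def train_max_min(pos_scores, neg_scores, lr=0.1, steps=500):
    """Fit (a, b) by gradient ascent on
    J(a, b) = mean(p(pos)) - mean(p(neg)),
    i.e. maximize pseudo-probabilities for positive samples while
    minimizing them for negative samples."""
    a, b = 1.0, 0.0
    for _ in range(steps):
        p_pos = pseudo_prob(pos_scores, a, b)
        p_neg = pseudo_prob(neg_scores, a, b)
        # d sigmoid / d z = p * (1 - p); chain rule for a and b
        g_pos = p_pos * (1.0 - p_pos)
        g_neg = p_neg * (1.0 - p_neg)
        da = np.mean(g_pos * pos_scores) - np.mean(g_neg * neg_scores)
        db = np.mean(g_pos) - np.mean(g_neg)
        a += lr * da
        b += lr * db
    return a, b
```

On well-separated scores (positives scoring high, negatives low), the learned transform pushes the two classes toward opposite ends of (0, 1), which is the separability effect the method aims for.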
Similar Resources
A Soft Target Learning Method of Posterior Pseudo-probabilities Based Classifiers with Its Application to Handwritten Digit Recognition
This paper proposes a soft-target discriminative learning method for posterior pseudo-probability based classification. The empirical loss is measured using two soft targets corresponding to the positive and negative samples of the class. The learning objective is to minimize the empirical loss and maximize the difference between the two soft targets. Consequently, we obtain unknow...
Estimation of Posterior Probabilities with Neural Networks: Application to Microcalcification Detection in Breast Cancer Diagnosis
Neural networks (NNs) are customarily used as classifiers aimed at minimizing classification error rates. However, it is known that the NN architectures that compute soft decisions can be used to estimate posterior class probabilities; sometimes, it could be useful to implement general decision rules other than the maximum a posteriori (MAP) decision criterion. In addition, probabilities provid...
Classifier Conditional Posterior Probabilities
Classifiers based on probability density estimates can be used to find posterior probabilities for the objects to be classified. These probabilities can be used for rejection or for combining classifiers. Posterior probabilities for other classifiers, however, have to be conditional on the classifier, i.e., they yield class probabilities for a given value of the classifier outcome instead for ...
Quantized Posterior Hashing: Efficient Posterior Exemplar Search Exploiting Class-Specific Sparsity Structures
This paper shows that exemplar-based speech processing using class-conditional posterior probabilities admits a highly effective search strategy relying on the posteriors' intrinsic sparsity structures. The posterior probabilities are estimated for phonetic and phonological classes using a deep neural network (DNN) computational framework. Exploiting the class-specific sparsity leads to a simple quan...
Visual Nonlinear Discriminant Analysis for Classifier Design
We present a new method for analyzing classifiers by visualization, which we call visual nonlinear discriminant analysis. Classifiers that output posterior probabilities are visualized by embedding samples and classes so as to approximate posterior probabilities using parametric embedding. The visualization provides a better intuitive understanding of such classifier characteristics as separabi...
Publication date: 2006